Valid Post-Selection Inference

Authors

  • RICHARD BERK
  • LAWRENCE BROWN
  • ANDREAS BUJA
  • KAI ZHANG
  • LINDA ZHAO
Abstract

It is common practice in statistical data analysis to perform data-driven variable selection and derive statistical inference from the resulting model. Such inference enjoys none of the guarantees that classical statistical theory provides for tests and confidence intervals when the model has been chosen a priori. We propose to produce valid “post-selection inference” by reducing the problem to one of simultaneous inference and hence suitably widening conventional confidence and retention intervals. Simultaneity is required for all linear functions that arise as coefficient estimates in all submodels. By purchasing “simultaneity insurance” for all possible submodels, the resulting post-selection inference is rendered universally valid under all possible model selection procedures. This inference is therefore generally conservative for particular selection procedures, but it is always less conservative than full Scheffé protection. Importantly it does not depend on the truth of the selected submodel, and hence it produces valid inference even in wrong models. We describe the structure of the simultaneous inference problem and give some asymptotic results.
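
To make the simultaneity requirement concrete, the following is a minimal Monte Carlo sketch, in Python, of the kind of constant involved: enumerate the least-squares coefficient functionals of every submodel, simulate the maximum of their absolute t-type statistics under Gaussian noise, and take its (1 - alpha) quantile as the simultaneous multiplier K. This is not code from the paper; the names (posi_constant, n_sim, seed) are illustrative, and it assumes the error variance is known, whereas the paper studentizes with an estimate of sigma.

import itertools
import numpy as np

def posi_constant(X, alpha=0.05, n_sim=2000, seed=0):
    """Monte Carlo approximation of a constant K with
    P( max over submodels M and j in M of |t_{j.M}| <= K ) ~ 1 - alpha,
    under Gaussian errors with known variance (a simplification)."""
    n, p = X.shape
    rng = np.random.default_rng(seed)
    # One unit vector per (submodel M, coefficient j in M): the normalized
    # least-squares functional whose inner product with the noise is t_{j.M}.
    directions = []
    for size in range(1, p + 1):
        for M in itertools.combinations(range(p), size):
            G = np.linalg.pinv(X[:, list(M)])  # rows = coefficient functionals of submodel M
            for row in G:
                directions.append(row / np.linalg.norm(row))
    D = np.array(directions)                   # p * 2**(p-1) rows; feasible only for small p
    Z = rng.standard_normal((n_sim, n))        # simulated standardized noise vectors
    max_abs_t = np.abs(Z @ D.T).max(axis=1)    # max |t| over all submodel coefficients
    return np.quantile(max_abs_t, 1 - alpha)

# Usage: report estimate +/- K * SE for whichever coefficient the selection
# procedure ends up reporting, instead of the usual +/- 1.96 * SE.
X = np.random.default_rng(1).standard_normal((50, 4))
print(round(posi_constant(X), 2))

For a design like this (p = 4, alpha = 0.05), K falls between the unadjusted normal quantile 1.96 and the known-variance Scheffé bound sqrt(chi^2_{4, 0.95}) ≈ 3.08, in line with the claim above that the adjustment is always less conservative than full Scheffé protection.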

Similar Articles

Valid Post-Selection and Post-Regularization Inference: An Elementary, General Approach

Here we present an expository, general analysis of valid post-selection or post-regularization inference about a low-dimensional target parameter, α, in the presence of a very high-dimensional nuisance parameter, η, which is estimated using modern selection or regularization methods. Our analysis relies on high-level, easy-to-interpret conditions that allow one to clearly see the structures nee...
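
As a heavily simplified illustration of the setting this framework covers, the sketch below uses the partialling-out recipe: lasso the high-dimensional controls out of both the outcome and the target regressor, then estimate the low-dimensional target alpha by regressing residuals on residuals with a conventional robust standard error. The simulated data, the use of scikit-learn's LassoCV, and all variable names are assumptions made for the illustration, not material from the article.

import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(1)
n, p = 200, 150                       # more controls than a classical fit could handle well
W = rng.standard_normal((n, p))       # high-dimensional controls (the nuisance part)
d = W[:, 0] + rng.standard_normal(n)  # target regressor, confounded by the first control
alpha_true = 0.5
y = alpha_true * d + 2.0 * W[:, 0] + rng.standard_normal(n)

# Step 1: lasso y on W and d on W; keep the residuals.
ry = y - LassoCV(cv=5).fit(W, y).predict(W)
rd = d - LassoCV(cv=5).fit(W, d).predict(W)

# Step 2: OLS of the y-residuals on the d-residuals estimates alpha; the moment
# condition is orthogonal to small errors in the two lasso steps, which is what
# makes a conventional confidence interval asymptotically valid here.
alpha_hat = (rd @ ry) / (rd @ rd)
eps = ry - alpha_hat * rd
se = np.sqrt(np.sum(rd**2 * eps**2)) / (rd @ rd)
print(f"alpha_hat = {alpha_hat:.3f} +/- {1.96 * se:.3f}")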

Post-selection inference for l1-penalized likelihood models

Following the article [2], we present a new method for post-selection inference for l1 (lasso) penalized likelihood models, including generalized regression models. Our approach generalizes the post-selection framework presented in Lee et al. (2013) [1]. The method provides p-values and confidence intervals that are asymptotically valid, conditional on the inherent selection done by the lasso. W...
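
The lasso-specific machinery is more than a short excerpt can show, but the principle of "valid conditional on the selection" can be illustrated with a toy thresholding rule in place of the lasso: a p-value computed from the null distribution truncated to the selection event stays uniform among selected coordinates, while the naive p-value does not. The threshold c, the simulation sizes, and the use of scipy.stats.norm are illustrative assumptions, not the method of the article.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
p, reps, c = 50, 2000, 1.0                    # c is the selection threshold (toy choice)
naive, conditional = [], []
for _ in range(reps):
    y = rng.standard_normal(p)                # global null: every mean is zero
    for yj in y[np.abs(y) > c]:               # coordinates that were "selected"
        naive.append(2 * norm.sf(abs(yj)))                  # ignores the selection
        conditional.append(norm.sf(abs(yj)) / norm.sf(c))   # conditions on |y_j| > c

print("P(p-value <= 0.05) under the null, among selected coordinates:")
print(f"  naive       {np.mean(np.array(naive) <= 0.05):.3f}  (anti-conservative)")
print(f"  conditional {np.mean(np.array(conditional) <= 0.05):.3f}  (close to 0.05, as it should be)")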

Journal:

Volume, Issue:

Pages:

Publication date: 2011